Daily-Dose

Contents

From New Yorker

From Vox

One study looked at what happened when Twitter banned extremist alt-right influencers Alex Jones, Milo Yiannopoulos, and Owen Benjamin. Jones was banned from Twitter in 2018 for what the company found to be “abusive behavior,” Yiannopoulos was banned in 2016 for harassing Ghostbusters actress Leslie Jones, and Benjamin lost access in 2018 for harassing a Parkland shooting survivor. The study, which examined posts referencing these influencers in the six months after their bans, found that references to them on Twitter dropped by an average of nearly 92 percent.

The study also found that the influencers’ followers who remained on Twitter exhibited a modest but statistically significant drop of about 6 percent in the “toxicity” of their subsequent tweets, as measured by Perspective API, an industry-standard tool that defines a toxic comment as “a rude, disrespectful, or unreasonable comment that is likely to make you leave a discussion.”
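
The article does not describe the researchers’ measurement pipeline, but as a rough illustration of how a single comment can be scored against Perspective’s TOXICITY attribute, here is a minimal Python sketch using the API’s public comments:analyze endpoint. The API key is a placeholder, the example comment is invented, and the requests library is assumed to be installed.

    # Minimal sketch: score one comment's toxicity with the public Perspective API.
    # Assumes the `requests` library and a placeholder API key from Google Cloud.
    import requests

    API_KEY = "YOUR_API_KEY"  # placeholder; not a real credential
    URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
           "comments:analyze?key=" + API_KEY)

    def toxicity_score(text):
        """Return the TOXICITY probability (0.0-1.0) Perspective assigns to `text`."""
        payload = {
            "comment": {"text": text},
            "languages": ["en"],
            "requestedAttributes": {"TOXICITY": {}},
        }
        response = requests.post(URL, json=payload, timeout=10)
        response.raise_for_status()
        scores = response.json()["attributeScores"]
        return scores["TOXICITY"]["summaryScore"]["value"]

    # Example: a score near 1.0 means the comment is likely to read as toxic.
    print(toxicity_score("You are an idiot and nobody wants you here."))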

Researchers also found that after Twitter banned these influencers, users talked less about the ideologies they promoted. For example, Jones was one of the leading propagators of the false conspiracy theory that the Sandy Hook school shooting was staged. Researchers ran a regression model to measure whether mentions of Sandy Hook dropped because of Jones’s ban, and found that they decreased by an estimated 16 percent over the six months following it.

“Many of the most offensive ideas that these influencers were propagating reduced in their prevalence after the deplatforming. So that’s good news,” said Shagun Jhaver, a professor of library and information science at Rutgers University who co-authored the study.

Another study from 2020 looked at the effects of Reddit banning the subreddit r/The_Donald, a popular forum for Trump supporters that was shut down in 2020 after moderators failed to control anti-Semitism, misogyny, and other hateful content being shared. Also banned was the subreddit r/incels, an “involuntary celibate” community that was shut down in 2017 for hosting violent content. The study found that the bans significantly reduced the overall number of active users, newcomers, and posts on the new platforms that those followers moved to, such as 4chan and Gab. These users also posted less frequently, on average, on the new platforms.

But the study also found that for the subset of users who did move to fringe platforms, “toxicity” levels, a measure covering negative behaviors such as incivility, harassment, trolling, and cyberbullying, increased on average.

In particular, the study found evidence that users in the r/The_Donald community who migrated to the alternative website — thedonald.win — became more toxic, negative, and hostile when talking about their “objects of fixation,” such as Democrats and leftists.

The study supports the idea that there is an inherent trade-off with deplatforming extremism: You might reduce the size of the extremist communities, but possibly at the expense of making the remaining members of those communities even more extreme.

“We know that deplatforming works, but we have to accept that there’s no silver bullet,” said Cassie Miller, a senior research analyst at the Southern Poverty Law Center who studies extremist domestic movements. “Tech companies and government are going to have to continually adapt.”

A member of the Proud Boys makes an “okay” sign with his hand to symbolize “white power” as he gathers with others in front of the Oregon State Capitol in Salem during a far-right rally on January 8. (Mathieu Lewis-Rolland/AFP via Getty Images)

All six of the extremism researchers Recode spoke with said that they’re worried about the more insular, localized, and radical organizing happening on fringe networks.

“We’ve had our eyes so much on national-level actions and organizing that we’re losing sight of the really dangerous activities that are being organized more quietly on these sites at the state and local level,” Tromble told Recode.

Some of this alarming organizing is still happening on Facebook, but it’s often flying under the radar in private Facebook Groups, which can be harder for researchers and the public to detect.

Meta, the parent company of Facebook, told Recode that its stronger policies and increased enforcement against extremists have been effective in reducing the overall volume of violent and hateful speech on its platform.

“This is an adversarial space and we know that our work to protect our platforms and the people who use them from these threats never ends. However, we believe that our work has helped to make it harder for harmful groups to organize on our platforms,” said David Tessler, a public policy manager at Facebook.

Facebook also said that, according to its own research, when the company made disruptions that targeted hate groups and organizations, there was a short-term backlash among some audience members. The backlash eventually faded, resulting in an overall reduction of hateful content. Facebook declined to share a copy of its research, which it says is ongoing, with Recode.

Twitter declined to comment on any impact it has seen around content regarding the extremist groups QAnon, Proud Boys, or boogaloos since their suspensions from its platform, but shared the following statement: “We continue to enforce the Twitter Rules, prioritizing [taking down] content that has the potential to lead to real-world harm.”

Will the rules of deplatforming apply equally to everyone?

In the past several years, extremist ideology and conspiracy theories have increasingly penetrated mainstream US politics. At least 36 candidates running for Congress in 2022 believe in QAnon, a majority of Republicans say they believe the false conspiracy theory that the 2020 election was stolen from Trump, and one in four Americans says violence against the government is sometimes justified. The ongoing test for social media companies will be whether they’ve learned lessons from dealing with the extremist movements that spread on their platforms, and whether they will enforce their rules effectively, even when dealing with politically powerful figures.

While Twitter and Facebook were long hesitant to moderate Trump’s accounts, they decided to ban him after he refused to concede his loss in the election, then used social media to egg on the violent protesters at the US Capitol. (In Facebook’s case, the ban is only until 2023.) Meanwhile, there are plenty of other major figures in conservative politics and the Republican Party who are active on social media and continue to propagate extremist conspiracy theories.

Rep. Marjorie Taylor Greene wears a mask reading “CENSORED” at the US Capitol on January 13, 2021. (Stefani Reynolds/Getty Images)

For example, even some members of Congress, like Rep. Marjorie Taylor Greene (R-GA), have used their Twitter and Facebook accounts to broadcast extremist ideologies, like the “Great Replacement” white nationalist theory, which falsely asserts that there is a “Zionist” plot to replace people of European ancestry with other minorities in the West.

In January, Twitter banned Greene’s personal account after she repeatedly broke its content policies by sharing misinformation about Covid-19. But she continues to have an active presence on her work Twitter account and on Facebook.

Banning groups like the Proud Boys or QAnon was a relatively straightforward choice for social media companies; banning an elected official is more complicated. Lawmakers have regulatory power, and conservatives have long claimed that social media networks like Facebook and Twitter are biased against them, even though these platforms often promote conservative figures and speech.

“As more mainstream figures are saying the types of things that normally extremists were the ones saying online, that’s where the weak spot is, because a platform like Facebook doesn’t want to be in the business of moderating ideology,” Holt told Recode. “Mainstream platforms are getting better at enforcing against extremism, but they have not figured out the solution entirely.”

From The Hindu: Sports

From The Hindu: National News

From BBC: Europe

From Ars Technica

From Jokes Subreddit